Research Article | Open Access
Volume 2025 | Article ID 100091 | https://doi.org/10.1016/j.plaphe.2025.100091

Multi-modal few-shot learning for anthesis prediction of individual wheat plants

Yiting Xie,1,2,3 Stuart J. Roy,1,3 Rhiannon K. Schilling,1,4,5 Huajian Liu1,2

1University of Adelaide, Australia
2Australian Plant Phenomics Network, Australia
3ARC Training Centre for Future Crops Development, Australia
4South Australian Research & Development Institute, Australia
5Flinders University, Australia

Received: 17 Dec 2024
Accepted: 17 Jul 2025
Published: 21 Jul 2025

Abstract

Anthesis prediction is crucial for wheat breeding. While current tools estimate average anthesis at the field scale, they do not meet the needs of breeders who require accurate predictions for individual plants. Hybrid breeders must finalize pollination plans at least 10 days before flowering, and biotechnology field trials in the United States and Australia must report to regulators 7–14 days before the first plant flowers. Currently, predicting the anthesis of individual wheat plants is a labour-intensive, inefficient, and costly process. Individual wheat plants of the same cultivar within the same field may show substantial variation in anthesis timing because of differences in their immediate surroundings. In this study, we developed an efficient and cost-effective machine vision approach to predict the anthesis of individual wheat plants. By integrating RGB imagery with in-situ meteorological data, our multimodal framework recasts anthesis prediction as a binary or three-class classification task, matching breeders' need for per-plant flowering predictions in the crucial days before anthesis. Furthermore, we incorporated a few-shot learning method to improve the model's adaptability across growth environments and to address the challenge of limited training data. The model achieved an F1 score above 0.8 in all planting settings.
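To make the framework described in the abstract concrete, the sketch below shows one way a multimodal few-shot classifier of this kind could be assembled. It is a minimal, illustrative PyTorch example, not the authors' implementation: the backbone sizes, the meteorological feature dimension, fusion by concatenation, and the prototypical-network episode structure are all assumptions made for illustration.

# Illustrative sketch (not the authors' released code): a minimal multimodal
# few-shot classifier in PyTorch. It fuses an RGB image embedding with an
# in-situ meteorological feature vector and classifies days-to-anthesis into
# a small number of classes via prototypical-network-style few-shot learning.
# Module sizes, feature dimensions, and the fusion strategy are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiModalEncoder(nn.Module):
    """Encode an RGB image and a meteorological vector into one embedding."""

    def __init__(self, met_dim: int = 8, embed_dim: int = 64):
        super().__init__()
        # Small CNN backbone for RGB imagery (stand-in for any pretrained net).
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # MLP for in-situ meteorological data (e.g. temperature, humidity).
        self.met = nn.Sequential(nn.Linear(met_dim, 32), nn.ReLU())
        # Late fusion: concatenate both modalities, project to embed_dim.
        self.fuse = nn.Linear(32 + 32, embed_dim)

    def forward(self, image: torch.Tensor, met_feats: torch.Tensor) -> torch.Tensor:
        return self.fuse(torch.cat([self.cnn(image), self.met(met_feats)], dim=-1))


def prototype_logits(encoder, support_img, support_met, support_y,
                     query_img, query_met, n_classes: int):
    """Few-shot classification: class prototypes are mean support embeddings;
    queries are scored by negative Euclidean distance to each prototype."""
    s = encoder(support_img, support_met)
    q = encoder(query_img, query_met)
    protos = torch.stack([s[support_y == c].mean(0) for c in range(n_classes)])
    return -torch.cdist(q, protos)  # higher = closer = more likely


if __name__ == "__main__":
    enc = MultiModalEncoder()
    # Toy episode: 3 classes (e.g. flowers in <7 / 7-14 / >14 days), 5 shots each.
    y = torch.arange(3).repeat_interleave(5)
    logits = prototype_logits(
        enc,
        torch.randn(15, 3, 64, 64), torch.randn(15, 8), y,
        torch.randn(4, 3, 64, 64), torch.randn(4, 8),
        n_classes=3,
    )
    print(F.softmax(logits, dim=-1))  # class probabilities for 4 query plants

In a formulation like this, adapting to a new planting setting requires only a handful of labelled support plants per class, which is what makes the few-shot setup attractive when training data are scarce.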
